99 research outputs found

A picture is worth a thousand words: content-based image retrieval techniques

In my dissertation I investigate techniques for improving the state of the art in content-based image retrieval. To place my work into context, I highlight the current trends and challenges in my field by analyzing over 200 recent articles. Next, I propose a novel paradigm called "artificial imagination", which gives the retrieval system the power to imagine and think along with the user in terms of what she is looking for. I then introduce a new user interface for visualizing and exploring image collections, empowering the user to navigate large collections based on her own needs and preferences, while simultaneously providing her with an accurate sense of what the database has to offer. In the later chapters I present work dealing with millions of images and focus in particular on high-performance techniques that minimize memory and computational use for both near-duplicate image detection and web search. Finally, I show early work on a scene completion-based image retrieval engine, which synthesizes realistic imagery that matches what the user has in mind.
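
    For readers unfamiliar with the basic pipeline the title refers to, here is a minimal, generic content-based retrieval sketch: each image is reduced to a feature vector (here, a plain color histogram) and the collection is ranked by distance to the query. This is a textbook baseline shown for illustration only, not the dissertation's own methods; the function names are hypothetical and NumPy/Pillow are assumed to be available.

    ```python
    # Generic content-based image retrieval baseline: global color histograms
    # as descriptors, L1 distance for ranking. Illustration only.
    import numpy as np
    from PIL import Image

    def color_histogram(path, bins=8):
        """Return a normalized joint RGB histogram as the image descriptor."""
        rgb = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
        hist, _ = np.histogramdd(rgb, bins=(bins, bins, bins), range=[(0, 256)] * 3)
        hist = hist.ravel()
        return hist / hist.sum()

    def retrieve(query_path, collection_paths, k=5):
        """Rank collection images by L1 distance between their histograms."""
        q = color_histogram(query_path)
        scored = [(np.abs(q - color_histogram(p)).sum(), p) for p in collection_paths]
        return [p for _, p in sorted(scored)[:k]]
    ```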

    Numerical methods for hyperbolic and parabolic integro-differential equations

An analysis by energy methods is given for fully discrete numerical methods for time-dependent partial integro-differential equations. Stability and error estimates are derived in the H^1 and L^2 norms. The methods considered account for the storage requirements during time-stepping.
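
    As an illustration of the class of problems and schemes the abstract refers to, consider the following representative parabolic integro-differential model and a generic fully discrete backward Euler–Galerkin scheme with quadrature for the memory term. This is a standard illustrative setting, not necessarily the paper's exact formulation.

    ```latex
    % Model problem and a generic fully discrete scheme on a finite element
    % space S_h (backward Euler in time, quadrature for the memory term).
    \[
      u_t + A u = \int_0^t B(t,s)\, u(s)\, \mathrm{d}s + f(t), \qquad u(0) = u_0 ,
    \]
    \[
      \bigl(\bar{\partial}_t U^n, \chi\bigr) + A\bigl(U^n, \chi\bigr)
      = \sum_{j=0}^{n-1} \omega_{nj}\, B\bigl(t_n, t_j; U^j, \chi\bigr)
        + \bigl(f(t_n), \chi\bigr)
      \qquad \forall \chi \in S_h ,
    \]
    % where \bar{\partial}_t U^n = (U^n - U^{n-1})/\tau, and A(\cdot,\cdot),
    % B(t,s;\cdot,\cdot) are the bilinear forms associated with A and B(t,s).
    % The quadrature weights \omega_{nj} determine how many past time levels
    % U^j must be kept, which is exactly the storage question such methods address.
    ```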

Galerkin FEM for fractional order parabolic equations with initial data in H^{-s}, 0 < s ≤ 1

We investigate semi-discrete numerical schemes based on the standard Galerkin and lumped mass Galerkin finite element methods for an initial-boundary value problem for homogeneous fractional diffusion problems with non-smooth initial data. We assume that Ω ⊂ ℝ^d, d = 1, 2, 3, is a convex polygonal (polyhedral) domain. We theoretically justify optimal order error estimates in the L_2- and H^1-norms for initial data in H^{-s}(Ω), 0 ≤ s ≤ 1. We confirm our theoretical findings with a number of numerical tests that include initial data v being a Dirac δ-function supported on a (d-1)-dimensional manifold.
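
    For concreteness, the homogeneous fractional diffusion model problem and the semidiscrete Galerkin scheme described in the abstract take the following standard form; the boundary conditions and notation are the usual ones and are assumed here, not quoted from the paper.

    ```latex
    % Model problem, with 0 < \alpha < 1 and \partial_t^{\alpha} the Caputo
    % fractional derivative, and its semidiscrete Galerkin approximation on a
    % finite element space X_h \subset H^1_0(\Omega).
    \[
      \partial_t^{\alpha} u - \Delta u = 0 \ \ \text{in } \Omega \times (0,T], \qquad
      u = 0 \ \ \text{on } \partial\Omega \times (0,T], \qquad
      u(0) = v \ \ \text{in } \Omega ,
    \]
    \[
      \bigl(\partial_t^{\alpha} u_h, \chi\bigr) + \bigl(\nabla u_h, \nabla \chi\bigr) = 0
      \quad \forall \chi \in X_h, \ t > 0, \qquad u_h(0) = v_h ,
    \]
    % where v_h is a suitable approximation of the (possibly non-smooth) data v.
    % The lumped mass variant replaces the first inner product by its
    % quadrature-lumped counterpart.
    ```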

    One-Sided Position-Dependent Smoothness-Increasing Accuracy-Conserving (SIAC) Filtering Over Uniform and Non-Uniform Meshes

In this paper, we introduce a new position-dependent Smoothness-Increasing Accuracy-Conserving (SIAC) filter that retains the benefits of position dependence while ameliorating some of its shortcomings. As in the previous position-dependent filter, our new filter can be applied near domain boundaries, near a discontinuity in the solution, or at the interface of different mesh sizes; and as before, in general, it numerically enhances the accuracy and increases the smoothness of approximations obtained using the discontinuous Galerkin (dG) method. However, the previously proposed position-dependent one-sided filter had two significant disadvantages: (1) increased computational cost (in terms of function evaluations), brought about by the use of 4k+1 central B-splines near a boundary (leading to increased kernel support), and (2) increased numerical conditioning issues that necessitated the use of quadruple precision for polynomial degrees k ≥ 3 for the reported accuracy benefits to be realizable numerically. Our new filter addresses both of these issues: it maintains the same support size and similar function evaluation characteristics as the symmetric filter while having better numerical conditioning, which, unlike its predecessor, makes it amenable to GPU computing. Our new filter was conceived by revisiting the original error analysis for superconvergence of SIAC filters and by examining the role of the B-splines and their weights in the SIAC filtering kernel. We demonstrate, in the uniform mesh case, that our new filter is globally superconvergent for k = 1 and superconvergent in the interior (i.e., the region excluding the boundary) for k ≥ 2. Furthermore, we present the first theoretical proof of superconvergence for postprocessing over smoothly varying meshes, and explain the accuracy-order conserving nature of this new filter when applied to certain non-uniform mesh cases. We provide numerical examples supporting our theoretical results and demonstrating that our new filter, in general, enhances the smoothness and accuracy of the solution. Numerical results are presented for solutions of both linear and nonlinear equations solved on uniform and non-uniform one- and two-dimensional meshes.
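
    For context, the underlying SIAC post-processing step is a convolution of the dG solution against a B-spline kernel. The standard symmetric construction on a uniform mesh of size h is sketched below; the paper's contribution is a new position-dependent variant of this kernel, which is not reproduced here.

    ```latex
    % Symmetric SIAC post-processing: convolve the dG solution u_h with a
    % kernel built from 2k+1 central B-splines of order k+1.
    \[
      u_h^{\star}(x) = \bigl(K_h^{(2k+1,\,k+1)} * u_h\bigr)(x)
      = \frac{1}{h} \int_{\mathbb{R}} K^{(2k+1,\,k+1)}\!\left(\frac{x-y}{h}\right) u_h(y)\, \mathrm{d}y ,
    \]
    \[
      K^{(2k+1,\,k+1)}(x) = \sum_{\gamma=-k}^{k} c_\gamma\, \psi^{(k+1)}(x - \gamma),
    \]
    % where \psi^{(k+1)} is the central B-spline of order k+1 and the weights
    % c_\gamma are chosen so that the kernel reproduces polynomials of degree
    % up to 2k. For linear hyperbolic problems on uniform meshes this raises
    % the accuracy from O(h^{k+1}) to O(h^{2k+1}) in the interior.
    ```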

    The Multiscale Systems Immunology project: software for cell-based immunological simulation

Background: Computer simulations are of increasing importance in modeling biological phenomena. Their purpose is to predict behavior and guide future experiments. The aim of this project is to model the early immune response to vaccination by an agent-based immune response simulation that incorporates realistic biophysics and intracellular dynamics, and which is sufficiently flexible to accurately model the multi-scale nature and complexity of the immune system, while maintaining the high performance critical to scientific computing. Results: The Multiscale Systems Immunology (MSI) simulation framework is an object-oriented, modular simulation framework written in C++ and Python. The software implements a modular design that allows for flexible configuration of components and initialization of parameters, thus allowing simulations to be run that model processes occurring over different temporal and spatial scales. Conclusion: MSI addresses the need for a flexible and high-performing agent-based model of the immune system.
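
    To make the modular, agent-based design concrete, here is a minimal sketch of such a simulation loop in Python: interchangeable component modules act on a shared population of cell agents at each time step. All class and method names are hypothetical illustrations of the pattern, not the MSI API.

    ```python
    # Minimal agent-based simulation loop with pluggable component modules.
    # Hypothetical illustration of the design pattern described in the abstract.
    from dataclasses import dataclass
    import random

    @dataclass
    class Cell:
        kind: str            # e.g. "dendritic", "tcell"
        position: tuple      # (x, y) location on a simple 2-D grid
        activated: bool = False

    class DiffusionModule:
        """Stand-in for a biophysics component (e.g. random cell motility)."""
        def step(self, cells, dt):
            for c in cells:
                x, y = c.position
                c.position = (x + random.choice((-1, 0, 1)),
                              y + random.choice((-1, 0, 1)))

    class ActivationModule:
        """Stand-in for an intracellular-dynamics component."""
        def step(self, cells, dt):
            for c in cells:
                if c.kind == "tcell" and random.random() < 0.01 * dt:
                    c.activated = True

    @dataclass
    class Simulation:
        cells: list
        modules: list
        dt: float = 1.0      # time step; each module interprets it at its own scale

        def run(self, n_steps):
            for _ in range(n_steps):
                for m in self.modules:     # modules are interchangeable components
                    m.step(self.cells, self.dt)

    if __name__ == "__main__":
        cells = [Cell("tcell", (0, 0)) for _ in range(100)]
        sim = Simulation(cells, [DiffusionModule(), ActivationModule()])
        sim.run(500)
        print(sum(c.activated for c in cells), "activated T cells")
    ```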

Smoothness-Increasing Accuracy-Conserving (SIAC) filtering and quasi-interpolation: A unified view

Filtering plays a crucial role in postprocessing and analyzing data in scientific and engineering applications. Various application-specific filtering schemes have been proposed based on particular design criteria. In this paper, we focus on establishing the theoretical connection between quasi-interpolation and a class of kernels (based on B-splines) that are specifically designed for the postprocessing of the discontinuous Galerkin (DG) method, called Smoothness-Increasing Accuracy-Conserving (SIAC) filtering. SIAC filtering, as the name suggests, aims to increase the smoothness of the DG approximation while conserving the inherent accuracy of the DG solution (superconvergence). The superconvergence properties of SIAC filtering have been studied in the literature. In this paper, we present theoretical results that establish the connection between SIAC filtering and long-standing concepts in approximation theory such as quasi-interpolation and polynomial reproduction. This connection bridges the gap between the two related disciplines and provides a decisive advancement in the design of new filters and the mathematical analysis of their properties. In particular, we derive a closed formulation for the convolution of SIAC kernels with polynomials. We also compare and contrast cardinal spline functions, as an example of filters designed for image processing applications, with SIAC filters of the same order, and study their properties.
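
    The key property linking the two viewpoints is polynomial reproduction, stated below in its standard form; the paper's precise statements and closed formulas are in the text itself.

    ```latex
    % A SIAC kernel K built from B-splines is designed so that convolution
    % leaves polynomials of degree at most 2k unchanged:
    \[
      (K * p)(x) = \int_{\mathbb{R}} K(x - y)\, p(y)\, \mathrm{d}y = p(x)
      \qquad \text{for all } p \in \mathbb{P}_{2k} .
    \]
    % This is precisely the reproduction condition used to construct spline
    % quasi-interpolants of the form
    \[
      Q f(x) = \sum_{j \in \mathbb{Z}} \lambda_j(f)\, B_j(x),
    \]
    % where the functionals \lambda_j are chosen so that Q reproduces
    % polynomials up to a given degree; the SIAC kernel weights play the role
    % of the quasi-interpolation coefficients.
    ```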

    Perceived connections between information and communication technology use and mental symptoms among young adults - a qualitative study

Background: Prospective associations have been found between high use of information and communication technology (ICT) and reported mental symptoms among young adult university students, but the causal mechanisms are unclear. Our aim was to explore possible explanations for associations between high ICT use and symptoms of depression, sleep disorders, and stress among young adults in order to propose a model of possible pathways to mental health effects that can be tested epidemiologically. Methods: We conducted a qualitative interview study with 16 women and 16 men (21-28 years), recruited from a cohort of university students on the basis of reporting high computer (n = 28) or mobile phone (n = 20) use at baseline and reporting mental symptoms at the one-year follow-up. Semi-structured interviews were performed, with open-ended questions about possible connections between the use of computers and mobile phones, and stress, depression, and sleep disturbances. The interview data were analyzed with qualitative content analysis and summarized in a model. Results: Central factors appearing to explain high quantitative ICT use were personal dependency, and demands for achievement and availability originating from the domains of work, study, social life, and individual aspirations. Consequences included mental overload, neglect of other activities and personal needs, time pressure, role conflicts, guilt feelings, social isolation, physical symptoms, worry about electromagnetic radiation, and economic problems. Qualitative aspects (destructive communication and information) were also reported, with consequences including vulnerability, misunderstandings, altered values, and feelings of inadequacy. User problems were a source of frustration. Altered ICT use as an effect of mental symptoms was reported, as well as possible positive effects of ICT on mental health. Conclusions: The concepts and ideas of the young adults with high ICT use and mental symptoms generated a model of possible paths for associations between ICT exposure and mental symptoms. Demands for achievement and availability as well as personal dependency were major causes of high ICT exposure but also direct sources of stress and mental symptoms. The proposed model shows that factors in different domains may have an impact and should be considered in epidemiological and intervention studies.

    Blau Syndrome-Associated Uveitis: Preliminary Results From an International Prospective Interventional Case Series

PURPOSE: To provide baseline and preliminary follow-up results from a 5-year longitudinal study of Blau syndrome. DESIGN: Multicenter, prospective interventional case series. METHODS: Baseline data from 50 patients from 25 centers worldwide, and follow-up data for patients followed 1, 2, or 3 years at the end of study enrollment. Ophthalmic data were collected at baseline and yearly visits by means of a standardized collection form. RESULTS: Median age at onset of eye disease was 60 months, and duration of eye disease at baseline was 145 months. At baseline 38 patients (78%) had uveitis, which was bilateral in 37 (97%). Eight patients (21%) had moderate to severe visual impairment. Panuveitis was found in 38 eyes (51%), with characteristic multifocal choroidal infiltrates in 29 eyes (39%). Optic disc pallor in 9 eyes (12%) and peripapillary nodules in 9 eyes (12%) were the commonest signs of optic nerve involvement. Active anterior chamber inflammation was noted in 30 eyes (40%) at baseline and in 16 (34%), 17 (57%), and 11 (61%) eyes at 1, 2, and 3 years, respectively. Panuveitis was associated with longer disease duration. At baseline, 56 eyes (75%) were on topical corticosteroids. Twenty-six patients (68%) received a combination of systemic corticosteroids and immunomodulatory therapy. CONCLUSIONS: Blau uveitis is characterized by progressive panuveitis with multifocal choroiditis, resulting in severe ocular morbidity despite continuous systemic and local immunomodulatory therapy. The frequency and severity of Blau uveitis highlight the need for close ophthalmologic surveillance as well as a search for more effective therapies.

    High-Order Numerical Methods for Solving Time Fractional Partial Differential Equations

The final publication is available at Springer via http://dx.doi.org/10.1007/s10915-016-0319-1. In this paper we introduce a new numerical method for solving time fractional partial differential equations. The time discretization is based on Diethelm's method, where the Hadamard finite-part integral is approximated by using piecewise quadratic interpolation polynomials. The space discretization is based on the standard finite element method. Error estimates with the convergence order O(τ^{3−α} + h^2) are obtained, where τ and h denote the time and space step sizes, respectively.
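
    For orientation, the standard setting behind such schemes is sketched below: the model time-fractional equation, the finite-part representation of the Caputo derivative that Diethelm-type methods discretize, and where the two terms of the stated error bound come from. This is the usual textbook formulation, not the paper's exact scheme or weights.

    ```latex
    % Model time-fractional equation, with 0 < \alpha < 1 and Caputo
    % derivative \partial_t^{\alpha}:
    \[
      \partial_t^{\alpha} u(x,t) - \Delta u(x,t) = f(x,t).
    \]
    % The Caputo derivative can be written as a Hadamard finite-part integral,
    \[
      \partial_t^{\alpha} u(\cdot,t)
      = \frac{1}{\Gamma(-\alpha)}\, \mathrm{p.f.}\!\int_0^t
        \frac{u(\cdot,s) - u(\cdot,0)}{(t-s)^{1+\alpha}}\, \mathrm{d}s ,
    \]
    % which Diethelm-type methods approximate by interpolating the integrand on
    % a uniform grid of step \tau; piecewise quadratic interpolation yields the
    % O(\tau^{3-\alpha}) accuracy in time, and a standard finite element space
    % of mesh size h contributes the O(h^2) term in the O(\tau^{3-\alpha}+h^2) bound.
    ```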